# Bimodal Pre-training
## Tapas Base Finetuned Tabfact

google · Apache-2.0 · Question Answering · Transformers · English

TAPAS is a BERT-like model based on the Transformer architecture, specifically designed for processing tabular data. It is pre-trained in a self-supervised manner on English Wikipedia table data and fine-tuned on the TabFact dataset to determine whether a sentence is supported or refuted by table content.
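As a minimal sketch, the checkpoint can be queried through the Hugging Face Transformers API. The table values and the claim below are made-up illustrations, and the label mapping (index 0 = refuted, index 1 = supported) is an assumption based on TabFact's binary labels:

```python
import pandas as pd
from transformers import TapasForSequenceClassification, TapasTokenizer

model_name = "google/tapas-base-finetuned-tabfact"
tokenizer = TapasTokenizer.from_pretrained(model_name)
model = TapasForSequenceClassification.from_pretrained(model_name)

# TapasTokenizer expects every table cell to be a string.
table = pd.DataFrame({"City": ["Paris", "Berlin"],
                      "Population": ["2148000", "3645000"]})
claim = "Berlin has a larger population than Paris."

inputs = tokenizer(table=table, queries=[claim], return_tensors="pt")
logits = model(**inputs).logits

# Assumption: index 0 = "refuted", index 1 = "supported".
print("supported" if logits.argmax(-1).item() == 1 else "refuted")
```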
## Codebert Base Mlm

microsoft · Large Language Model

CodeBERT is a pre-trained model for programming languages and natural languages, based on the RoBERTa architecture and trained with the Masked Language Modeling (MLM) objective.
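As a minimal sketch, the MLM head can be exercised with the Transformers fill-mask pipeline; the code fragment below is a made-up example, and `<mask>` is the mask token used by RoBERTa-style models:

```python
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="microsoft/codebert-base-mlm")

# Ask the model to recover the masked token in a code fragment.
for prediction in fill_mask("if (x is not None) <mask> (x > 1):"):
    print(prediction["token_str"], round(prediction["score"], 4))
```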